feat: Add Conversations API; update schemas, response links, and paging (#213)
Walkthrough

Adds a new Conversations API (CRUD and item management), numerous conversation/message/content schemas (-2 variants), updates Response to include conversation linkage, adjusts streaming event paths, revises pagination for list-items, and updates navigation/examples, all within openapi.yaml.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    actor Client
    participant API as API
    participant Conv as Conversations Service
    participant Resp as Responses Service
    rect rgb(245,248,255)
    note over Client,API: New: Create a conversation with initial items
    Client->>API: POST /conversations {items[], metadata}
    API->>Conv: CreateConversationRequest
    Conv-->>API: Conversation (id, metadata, timestamps)
    API-->>Client: 201 Conversation
    end
    rect rgb(245,255,245)
    note over Client,API: New: Create a response linked to a conversation
    Client->>API: POST /responses {conversation: {id}, ...}
    API->>Resp: Create Response (conversation_id)
    Resp-->>API: Response {conversation: {id}}
    API-->>Client: 200 Response (includes conversation)
    end
    rect rgb(255,248,240)
    note over Client,Conv: Updated streaming paths
    Client-->>API: Subscribe stream
    API-->>Client: event: response/output_text/delta
    API-->>Client: event: response/output_text/done
    end
```

```mermaid
sequenceDiagram
    autonumber
    actor Client
    participant API as API
    participant Conv as Conversations Service
    rect rgb(245,245,255)
    note over Client,Conv: New: Item listing with after/limit/order/include (no "before")
    Client->>API: GET /conversations/{id}/items?limit&order&after&include
    API->>Conv: ListItems(params)
    Conv-->>API: ConversationItemList {data[], first_id, last_id, has_more}
    API-->>Client: 200 ConversationItemList
    end
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60–90 minutes
Actionable comments posted: 8
🧹 Nitpick comments (9)
src/libs/tryAGI.OpenAI/openapi.yaml (9)
1133-1134: Fix endpoint description: the server generates the ID, not the client.

"Create a conversation with the given ID." is misleading for POST /conversations (there is no path ID). Recommend "Create a new conversation."

```diff
-      description: Create a conversation with the given ID.
+      description: Create a new conversation.
```
1141-1147: Add 201 for creates and standard error responses across new endpoints.

Return 201 for resource creation and document common 4xx/5xx responses for consistency and client handling.

```diff
       responses:
-        '200':
+        '201':
           description: OK
           content:
             application/json:
               schema:
                 $ref: '#/components/schemas/ConversationResource'
+        '400':
+          $ref: '#/components/responses/BadRequest'
+        '401':
+          $ref: '#/components/responses/Unauthorized'
+        '404':
+          $ref: '#/components/responses/NotFound'
+        '429':
+          $ref: '#/components/responses/RateLimitExceeded'
+        '500':
+          $ref: '#/components/responses/ServerError'
```

Apply analogous error responses to the retrieve/update/delete/list/create-item/get-item endpoints.
Also applies to: 1181-1186, 1219-1224, 1263-1268, 1328-1333, 1387-1392, 1433-1438, 1485-1490
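For reference, the suggested error `$ref`s assume shared response definitions exist under components/responses. A minimal sketch of what two such entries could look like, assuming an existing `Error` schema (the response names and the `Error` ref are illustrative, not confirmed parts of this spec):

```yaml
# Hypothetical reusable error responses; the spec would need to define
# these before refs like '#/components/responses/BadRequest' resolve.
components:
  responses:
    BadRequest:
      description: The request was malformed or failed validation.
      content:
        application/json:
          schema:
            $ref: '#/components/schemas/Error'
    Unauthorized:
      description: Missing or invalid API key.
      content:
        application/json:
          schema:
            $ref: '#/components/schemas/Error'
```

Defining these once keeps the per-endpoint response blocks short and guarantees every endpoint documents errors the same way.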
1151-1159: Unify examples: some SDK samples omit the request body; others include it.

For consistency and to reduce user confusion, either show minimal-empty and full examples for each SDK, or make them all pass consistent metadata/items.

Would you like me to generate harmonized examples per language?
1321-1327: Bound the include[] array size (Checkov CKV_OPENAPI_21) and clarify encoding.

Add maxItems to prevent unbounded query growth, and specify style/explode for arrays.

```diff
       - name: include
         in: query
         description: "Specify additional output data to include in the model response. Currently supported values are: ... "
+        style: form
+        explode: false
         schema:
           type: array
+          maxItems: 16
           items:
             $ref: '#/components/schemas/Includable'
```
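For context on the style/explode suggestion: in OpenAPI, `style: form` with `explode: false` means the client sends the array as a single comma-joined query parameter, while `explode: true` (the default for form style) repeats the key once per value. A quick sketch of the difference (the parameter values are made up):

```python
from urllib.parse import urlencode

values = ["message.input_image.image_url", "code_interpreter_call.outputs"]

# explode: false -> one "include" key whose value is the comma-joined list
non_exploded = urlencode({"include": ",".join(values)})

# explode: true -> the "include" key repeats once per array element
exploded = urlencode([("include", v) for v in values])

print(non_exploded)  # commas are percent-encoded as %2C
print(exploded)
```

Pinning explode in the spec matters because generated clients otherwise pick the default, and servers often accept only one of the two encodings.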
1447-1447: Node.js delete example parameter order differs from other SDKs.

Other SDKs use (conversation_id, item_id); the Node sample uses (item_id, { conversation_id }). Verify the intended Node signature and align examples accordingly.
5729-5735: Minor: unify placeholder IDs across examples.

Curl and other samples use response_id; the JS sample uses "resp_123". Standardize placeholders for clarity.

```diff
-const response = await client.responses.inputItems.list("resp_123");
+const response = await client.responses.inputItems.list("response_id");
```
11275-11286: Inconsistent schema reference: CreateConversationRequest uses Metadata, not MetadataParam as summarized.

Decide on a single metadata type for create vs. update to avoid client confusion; update the summary or the spec accordingly.

```diff
-        metadata:
-          $ref: '#/components/schemas/Metadata'
+        metadata:
+          $ref: '#/components/schemas/MetadataParam'
```

(Or keep Metadata here and switch UpdateConversationBody to Metadata for symmetry; pick one and document the rationale.)
17844-17896: Consider enforcing at least one content item for a Message.

Empty content arrays are likely invalid for messages.

```diff
       content:
         type: array
+        minItems: 1
         items:
           anyOf:
             - $ref: '#/components/schemas/InputTextContent-2'
             - $ref: '#/components/schemas/OutputTextContent-2'
             - $ref: '#/components/schemas/TextContent'
             - $ref: '#/components/schemas/SummaryTextContent'
             - $ref: '#/components/schemas/RefusalContent-2'
             - $ref: '#/components/schemas/InputImageContent-2'
             - $ref: '#/components/schemas/ComputerScreenshotContent'
             - $ref: '#/components/schemas/InputFileContent-2'
           discriminator:
             propertyName: type
         description: The content of the message
```
22053-22059: The instructions array variant could use bounds.

For better UX and safety, set minItems and a reasonable maxItems (e.g., 20).

```diff
     - title: Input item list
       type: array
       items:
         $ref: '#/components/schemas/InputItem'
+      minItems: 1
+      maxItems: 20
       description: "A list of one or many input items to the model, containing different content types. "
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
💡 Knowledge Base configuration:
- MCP integration is disabled by default for public repositories
- Jira integration is disabled by default for public repositories
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
⛔ Files ignored due to path filters (157)
All 157 excluded files are auto-generated C# sources under src/libs/tryAGI.OpenAI/Generated/, each excluded by the !**/generated/** path filter. They cover:

- ConversationsClient / IConversationsClient partials (CreateConversation, CreateConversationItems, DeleteConversation, DeleteConversationItem, GetConversation, GetConversationItem, ListConversationItems, UpdateConversation)
- JsonConverters for the new enums and discriminated unions (Annotation2, ComputerScreenshotContentType, ContainerFileCitationBody2Type, ContentItem, Conversation, ConversationItem, ConversationItemListObject, ConversationResourceObject, DeletedConversation, FileCitationBody2Type, InputFileContent2Type, InputImageContent2Detail/Type, InputTextContent2Type, ListConversationItemsOrder, MessageRole, MessageStatus, MessageType, OutputTextContent2Type, RefusalContent2Type, SummaryTextContentType, TemplateItem, TextContentType, UrlCitationBody2Type, plus their Nullable variants)
- Models for the new schemas (Annotation2, ComputerScreenshotContent, ContainerFileCitationBody2, ContentItem, Conversation, Conversation2, ConversationItem, ConversationItemList, ConversationParam, ConversationResource, CreateConversationItemsRequest, CreateConversationRequest, DeletedConversation, DeletedConversationResource, FileCitationBody2, InputFileContent2, InputImageContent2, InputTextContent2, LogProb2, Message, MessageContentItemDiscriminator, MetadataParam, OutputTextContent2, RefusalContent2, ResponseProperties, ResponseVariant3, SummaryTextContent, TemplateItem, TextContent, TopLogProb2, UpdateConversationBody, UrlCitationBody2, and related Json/Type files)
- Updated client surface files (IOpenAiClient.g.cs, OpenAiClient.g.cs, IResponsesClient.ListInputItems.g.cs, ResponsesClient.ListInputItems.g.cs, JsonSerializerContext files)
📒 Files selected for processing (1)
src/libs/tryAGI.OpenAI/openapi.yaml (33 hunks)
🧰 Additional context used
🪛 Checkov (3.2.334)
src/libs/tryAGI.OpenAI/openapi.yaml
[MEDIUM] 1324-1328: Ensure that arrays have a maximum number of items
(CKV_OPENAPI_21)
🔇 Additional comments (23)
src/libs/tryAGI.OpenAI/openapi.yaml (23)
7521-7527: LGTM: clear discriminator for Annotation-2. The anyOf + discriminator on type matches the -2 variants.
10192-10224: LGTM: ContainerFileCitationBody-2 looks consistent with other citation schemas.
12630-12637: LGTM: Response request accepts conversation as ID or object; clear description.
13870-13893: LGTM: DeletedConversation and resource variant are well-formed with const object value.
14862-14886: LGTM: FileCitationBody-2 structure is coherent and self-descriptive.
16820-16835: LGTM: InputTextContent-2 is straightforward and consistent.
17543-17562: LGTM: LogProb-2 mirrors TopLogProb-2 and looks correct.
18505-18510: LGTM: MetadataParam constraints are clear and reusable.
18978-19003: Are annotations always present on OutputTextContent-2?

If not guaranteed by all producers, make annotations optional to avoid breaking clients.

```diff
-      required:
-        - type
-        - text
-        - annotations
+      required:
+        - type
+        - text
```
21991-22006: LGTM: RefusalContent-2 mirrors existing pattern with const type.
22029-22031: LGTM: Response.conversation links to minimal Conversation-2; clean association.
22093-22094: LGTM: Response status description enumerates valid states.
23457-23458: LGTM: Clear incompatibility note with conversation for previous_response_id.
25512-25525: LGTM: SummaryTextContent is minimal and consistent with -2 variants.
25539-25552: LGTM: TextContent has const type and required text.
25815-25830: LGTM: TopLogProb-2 aligns with LogProb-2 structures.
26080-26086: LGTM: UpdateConversationBody uses MetadataParam; matches intent for request params.
26229-26257: LGTM: UrlCitationBody-2 is well-formed and explicit.
29253-29253: LGTM: New Conversations tag added to tags list.
29306-29340: Docs nav added for Conversations; ensure endpoint path keys match. Navigation uses path: create-items; the earlier x-oaiMeta used create-item. Once fixed as suggested above, the nav should resolve correctly.
29373-29377: LGTM: Stream event paths made explicit for response/output_text.
29492-29493: LGTM: Section title clarified.
30594-30594: LGTM: Navigation group renamed to "Responses API".
```yaml
      - name: limit
        in: query
        description: "A limit on the number of objects to be returned. Limit can range between\n1 and 100, and the default is 20.\n"
        schema:
          type: integer
          default: 20
      - name: order
```

🛠️ Refactor suggestion

Enforce documented limit range in schema.

The description says 1–100; encode that as minimum/maximum for validation and auto-generated clients.

```diff
       - name: limit
         in: query
         description: "A limit on the number of objects to be returned. Limit can range between\n1 and 100, and the default is 20.\n"
         schema:
           type: integer
           default: 20
+          minimum: 1
+          maximum: 100
```

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

```yaml
      - name: limit
        in: query
        description: "A limit on the number of objects to be returned. Limit can range between\n1 and 100, and the default is 20.\n"
        schema:
          type: integer
          default: 20
          minimum: 1
          maximum: 100
```
🤖 Prompt for AI Agents
In src/libs/tryAGI.OpenAI/openapi.yaml around lines 1301 to 1307, the schema
documents a numeric limit of 1–100 in the description but lacks explicit
validation; add JSON Schema validation by adding minimum: 1 and maximum: 100 to
the relevant numeric property declaration so generated clients and validators
enforce the documented range (also update any enum/format if needed to remain
consistent).
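To illustrate why encoding the range matters: code generators typically turn minimum/maximum into client-side guards, roughly like the following sketch (a hand-rolled stand-in, not what any particular generator emits):

```python
# Mirrors the suggested schema: integer, default 20, range 1-100 inclusive.
LIMIT_SCHEMA = {"type": "integer", "default": 20, "minimum": 1, "maximum": 100}

def validate_limit(value=None, schema=LIMIT_SCHEMA):
    """Apply the default when unset, then enforce the documented range."""
    if value is None:
        value = schema["default"]
    if not isinstance(value, int) or isinstance(value, bool):
        raise TypeError("limit must be an integer")
    if not schema["minimum"] <= value <= schema["maximum"]:
        raise ValueError(f"limit must be between {schema['minimum']} and {schema['maximum']}")
    return value

print(validate_limit())     # falls back to the default of 20
print(validate_limit(100))  # upper bound is inclusive
```

Without minimum/maximum in the spec, this guard never gets generated and out-of-range values only fail server-side.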
```yaml
        title: Add a user message to a conversation
        group: conversations
        name: Create items
        path: create-item
        returns: "Returns the list of added [items](https://platform.openai.com/docs/api-reference/conversations/list-items-object).\n"
```

Align x-oaiMeta path key with navigation (create-items vs create-item).

The section nav (lines 29323–29327) uses path: create-items, but here it's create-item. This mismatch can break docs linking.

```diff
-        path: create-item
+        path: create-items
```

Also consider renaming the title to reflect multi-item support:

```diff
-        title: Add a user message to a conversation
+        title: Create items in a conversation
```
+ title: Create items in a conversation🤖 Prompt for AI Agents
In src/libs/tryAGI.OpenAI/openapi.yaml around lines 1405 to 1409, the x-oaiMeta
path key uses "create-item" which mismatches the navigation entry that
references "create-items", breaking documentation links; update the
x-oaiMeta.path value to "create-items" to match the nav, and optionally rename
the associated title to a plural/multi-item form (e.g., "Create items" or
"Create multiple items") so the title reflects multi-item support and stays
consistent with the nav.
```yaml
    ComputerScreenshotContent:
      title: Computer screenshot
      required:
        - type
        - image_url
        - file_id
      type: object
      properties:
        file_id:
          type: string
          description: The identifier of an uploaded file that contains the screenshot.
          nullable: true
        image_url:
          type: string
          description: The URL of the screenshot image.
          nullable: true
        type:
          enum:
            - computer_screenshot
          type: string
          description: 'Specifies the event type. For a computer screenshot, this property is always set to `computer_screenshot`.'
          default: computer_screenshot
          x-stainless-const: true
```
Required but nullable properties conflict; the model should allow exactly one of image_url or file_id.

ComputerScreenshotContent requires both image_url and file_id yet marks both nullable. That combination is contradictory and weakens validation.

```diff
     ComputerScreenshotContent:
       title: Computer screenshot
-      required:
-        - type
-        - image_url
-        - file_id
+      required:
+        - type
       type: object
       properties:
         file_id:
           type: string
           description: The identifier of an uploaded file that contains the screenshot.
-          nullable: true
         image_url:
           type: string
           description: The URL of the screenshot image.
-          nullable: true
         type:
           enum:
             - computer_screenshot
           type: string
           description: 'Specifies the event type. For a computer screenshot, this property is always set to `computer_screenshot`.'
           default: computer_screenshot
           x-stainless-const: true
+      oneOf:
+        - required: [file_id]
+        - required: [image_url]
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
| ComputerScreenshotContent: | |
| title: Computer screenshot | |
| required: | |
| - type | |
| - image_url | |
| - file_id | |
| type: object | |
| properties: | |
| file_id: | |
| type: string | |
| description: The identifier of an uploaded file that contains the screenshot. | |
| nullable: true | |
| image_url: | |
| type: string | |
| description: The URL of the screenshot image. | |
| nullable: true | |
| type: | |
| enum: | |
| - computer_screenshot | |
| type: string | |
| description: 'Specifies the event type. For a computer screenshot, this property is always set to `computer_screenshot`.' | |
| default: computer_screenshot | |
| x-stainless-const: true | |
| ComputerScreenshotContent: | |
| title: Computer screenshot | |
| required: | |
| - type | |
| type: object | |
| properties: | |
| file_id: | |
| type: string | |
| description: The identifier of an uploaded file that contains the screenshot. | |
| image_url: | |
| type: string | |
| description: The URL of the screenshot image. | |
| type: | |
| enum: | |
| - computer_screenshot | |
| type: string | |
| description: 'Specifies the event type. For a computer screenshot, this property is always set to `computer_screenshot`.' | |
| default: computer_screenshot | |
| x-stainless-const: true | |
| oneOf: | |
| - required: [file_id] | |
| - required: [image_url] |
🤖 Prompt for AI Agents
In src/libs/tryAGI.OpenAI/openapi.yaml around lines 9983 to 10005, the
ComputerScreenshotContent schema currently marks both image_url and file_id as
nullable and effectively allows both/missing values; change the schema to
require exactly one of them by replacing the dual nullable approach with a oneOf
constraint: define two alternatives (one with required: [image_url] and
image_url typed as string/uri, the other with required: [file_id] and file_id
typed as string) and remove the conflicting "both nullable" requirement so
validation enforces that exactly one of image_url or file_id is provided.
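The exactly-one semantics the suggestion relies on can be checked mentally: under `oneOf`, a payload carrying both `image_url` and `file_id` matches both branches and fails, while a payload carrying neither matches no branch and also fails. A stdlib-only Python sketch (hypothetical payloads, not the real validator) mimics that rule:

```python
def exactly_one_source(payload: dict) -> bool:
    """Mimics oneOf: [{required: [file_id]}, {required: [image_url]}].

    A branch matches when its required key is present; oneOf demands
    that exactly one branch match.
    """
    branches = ["file_id" in payload, "image_url" in payload]
    return sum(branches) == 1

# Exactly one source present -> valid
assert exactly_one_source({"type": "computer_screenshot", "image_url": "https://example.com/s.png"})
# Both present -> both oneOf branches match -> invalid
assert not exactly_one_source({"type": "computer_screenshot", "image_url": "u", "file_id": "f"})
# Neither present -> no branch matches -> invalid
assert not exactly_one_source({"type": "computer_screenshot"})
```

This is why `nullable: true` must be dropped alongside the change: a present-but-null key would still satisfy the `required` branch while carrying no usable source.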
| enum: | ||
| - list | ||
| description: 'The type of object returned, must be `list`.' | ||
| x-stainless-const: true | ||
| description: A list of Conversation items. |
Missing type on enum field; strict OAS validators and code generators may reject or mistype it.
ConversationItemList.object enumerates 'list' but omits type: string.
object:
enum:
- list
+ type: string
description: 'The type of object returned, must be `list`.'
x-stainless-const: true

📝 Committable suggestion
| enum: | |
| - list | |
| description: 'The type of object returned, must be `list`.' | |
| x-stainless-const: true | |
| description: A list of Conversation items. | |
| object: | |
| enum: | |
| - list | |
| type: string | |
| description: 'The type of object returned, must be `list`.' | |
| x-stainless-const: true |
🤖 Prompt for AI Agents
In src/libs/tryAGI.OpenAI/openapi.yaml around lines 10431 to 10435, the
ConversationItemList.object lists an enum value 'list' but omits the required
type for the enum field; add "type: string" to that schema property (the enum
that contains 'list') so the OpenAPI spec validates (ensure the enum property
has type: string and the enum: ['list'] under the correct schema key).
| metadata: | ||
| description: "Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format, and querying for objects via API or the dashboard.\n Keys are strings with a maximum length of 64 characters. Values are strings with a maximum length of 512 characters." | ||
| object: |
🛠️ Refactor suggestion
metadata field lacks a schema; should reference Metadata or MetadataParam.
Currently only a description is provided. Add a $ref for strong typing and generator support.
metadata:
- description: "Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format, and querying for objects via API or the dashboard.
- Keys are strings with a maximum length of 64 characters. Values are strings with a maximum length of 512 characters."
+ $ref: '#/components/schemas/Metadata'

(If you intend a request-vs-response distinction, consider using MetadataParam in requests and Metadata in responses consistently.)
📝 Committable suggestion
| metadata: | |
| description: "Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format, and querying for objects via API or the dashboard.\n Keys are strings with a maximum length of 64 characters. Values are strings with a maximum length of 512 characters." | |
| object: | |
| metadata: | |
| $ref: '#/components/schemas/Metadata' |
🤖 Prompt for AI Agents
In src/libs/tryAGI.OpenAI/openapi.yaml around lines 10463 to 10465, the metadata
field only has a description and lacks a schema reference; update the metadata
property to include a $ref pointing to the appropriate component schema (use
#/components/schemas/MetadataParam for request payloads or
#/components/schemas/Metadata for responses) so that the field is strongly typed
and generators can pick up the structure; ensure the $ref replaces or is added
alongside the description (keeping the description if desired) and that
indentation and YAML syntax match surrounding schema entries.
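If the referenced `Metadata` component mirrors the description (up to 16 pairs, string keys of at most 64 characters, string values of at most 512 characters), the constraints it would encode look like this stdlib-only sketch; the limits are taken from the description text, and the actual component schema may differ:

```python
def valid_metadata(md: dict) -> bool:
    """Checks the limits stated in the description: at most 16 pairs,
    string keys <= 64 chars, string values <= 512 chars."""
    if len(md) > 16:
        return False
    return all(
        isinstance(k, str) and len(k) <= 64
        and isinstance(v, str) and len(v) <= 512
        for k, v in md.items()
    )

assert valid_metadata({"ticket": "T-123"})
assert not valid_metadata({"k": "x" * 513})  # value exceeds 512 chars
```

Pointing the property at a shared `$ref` keeps these limits in one place instead of restating them per endpoint.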
| anyOf: | ||
| - $ref: '#/components/schemas/InputTextContent' | ||
| - $ref: '#/components/schemas/InputImageContent' | ||
| - $ref: '#/components/schemas/InputFileContent' | ||
| discriminator: | ||
| propertyName: type | ||
| InputFileContent: | ||
| title: Input file | ||
| required: | ||
| - type | ||
| type: object | ||
| properties: | ||
| file_data: | ||
| type: string | ||
| description: "The content of the file to be sent to the model.\n" | ||
| file_id: | ||
| type: string | ||
| description: The ID of the file to be sent to the model. | ||
| nullable: true | ||
| file_url: | ||
| type: string | ||
| description: The URL of the file to be sent to the model. | ||
| filename: | ||
| type: string | ||
| description: The name of the file to be sent to the model. | ||
| type: | ||
| enum: | ||
| - input_file | ||
| type: string | ||
| description: The type of the input item. Always `input_file`. | ||
| default: input_file | ||
| x-stainless-const: true | ||
| description: "An audio input to the model.\n" | ||
| InputContent: | ||
| anyOf: | ||
| - $ref: '#/components/schemas/InputTextContent' | ||
| - $ref: '#/components/schemas/InputImageContent' | ||
| - $ref: '#/components/schemas/InputFileContent' | ||
| discriminator: | ||
| propertyName: type | ||
| InputFileContent: | ||
| description: A file input to the model. | ||
| InputFileContent-2: |
🛠️ Refactor suggestion
Require at least one of file_data, file_id, or file_url for InputFileContent.
As written, type is the only required property, allowing empty file content. Enforce a minimal valid payload.
InputFileContent:
title: Input file
required:
- type
type: object
properties:
file_data:
type: string
description: "The content of the file to be sent to the model.
"
file_id:
type: string
description: The ID of the file to be sent to the model.
nullable: true
file_url:
type: string
description: The URL of the file to be sent to the model.
filename:
type: string
description: The name of the file to be sent to the model.
type:
enum:
- input_file
type: string
description: The type of the input item. Always `input_file`.
default: input_file
x-stainless-const: true
description: A file input to the model.
+ anyOf:
+ - required: [file_id]
+ - required: [file_url]
+ - required: [file_data]

🤖 Prompt for AI Agents
In src/libs/tryAGI.OpenAI/openapi.yaml around lines 16630 to 16663, the
InputFileContent schema only requires "type" today, allowing empty file
payloads; update the schema to require at least one of file_data, file_id, or
file_url by adding an anyOf (or oneOf) constraint with three subschemas each
requiring one of those properties (or alternately a single schema with
minProperties and a dependent required rule), and keep "type" required — this
enforces a minimal valid payload while preserving existing fields.
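The at-least-one rule the prompt describes differs from the screenshot case: `anyOf` passes when one or more branches match, so a payload carrying both `file_id` and `file_url` remains valid, whereas `oneOf` would reject it. A stdlib-only sketch of the `anyOf` semantics (hypothetical payloads):

```python
def at_least_one_source(payload: dict) -> bool:
    """Mimics anyOf with three single-key required branches."""
    return any(key in payload for key in ("file_data", "file_id", "file_url"))

assert at_least_one_source({"type": "input_file", "file_id": "file-abc"})
# Multiple sources still satisfy anyOf (unlike oneOf)
assert at_least_one_source({"type": "input_file", "file_id": "file-abc", "file_url": "https://x/y.pdf"})
# The empty file content the suggestion rules out
assert not at_least_one_source({"type": "input_file"})
```

Choose `anyOf` here if duplicate sources are tolerated server-side, and `oneOf` only if they must be mutually exclusive.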
| title: Input image | ||
| required: | ||
| - type | ||
| - image_url | ||
| - file_id | ||
| - detail | ||
| type: object | ||
| properties: | ||
| detail: | ||
| enum: | ||
| - low | ||
| - high | ||
| - auto | ||
| type: string | ||
| description: 'The detail level of the image to be sent to the model. One of `high`, `low`, or `auto`. Defaults to `auto`.' | ||
| file_id: | ||
| type: string | ||
| description: The ID of the file to be sent to the model. | ||
| nullable: true | ||
| image_url: | ||
| type: string | ||
| description: The URL of the image to be sent to the model. A fully qualified URL or base64 encoded image in a data URL. | ||
| nullable: true | ||
| type: | ||
| enum: | ||
| - input_image | ||
| type: string | ||
| description: The type of the input item. Always `input_image`. | ||
| default: input_image | ||
| x-stainless-const: true |
InputImageContent-2 requires both image_url and file_id but marks both nullable; use oneOf and drop nullable.
This mirrors the screenshot issue; enforce exactly one source.
InputImageContent-2:
title: Input image
- required:
- - type
- - image_url
- - file_id
- - detail
+ required:
+ - type
+ - detail
type: object
properties:
detail:
enum:
- low
- high
- auto
type: string
description: 'The detail level of the image to be sent to the model. One of `high`, `low`, or `auto`. Defaults to `auto`.'
file_id:
type: string
description: The ID of the file to be sent to the model.
- nullable: true
image_url:
type: string
description: The URL of the image to be sent to the model. A fully qualified URL or base64 encoded image in a data URL.
- nullable: true
type:
enum:
- input_image
type: string
description: The type of the input item. Always `input_image`.
default: input_image
x-stainless-const: true
+ oneOf:
+ - required: [file_id]
+ - required: [image_url]

📝 Committable suggestion
| title: Input image | |
| required: | |
| - type | |
| - image_url | |
| - file_id | |
| - detail | |
| type: object | |
| properties: | |
| detail: | |
| enum: | |
| - low | |
| - high | |
| - auto | |
| type: string | |
| description: 'The detail level of the image to be sent to the model. One of `high`, `low`, or `auto`. Defaults to `auto`.' | |
| file_id: | |
| type: string | |
| description: The ID of the file to be sent to the model. | |
| nullable: true | |
| image_url: | |
| type: string | |
| description: The URL of the image to be sent to the model. A fully qualified URL or base64 encoded image in a data URL. | |
| nullable: true | |
| type: | |
| enum: | |
| - input_image | |
| type: string | |
| description: The type of the input item. Always `input_image`. | |
| default: input_image | |
| x-stainless-const: true | |
| InputImageContent-2: | |
| title: Input image | |
| required: | |
| - type | |
| - detail | |
| type: object | |
| properties: | |
| detail: | |
| enum: | |
| - low | |
| - high | |
| - auto | |
| type: string | |
| description: 'The detail level of the image to be sent to the model. One of `high`, `low`, or `auto`. Defaults to `auto`.' | |
| file_id: | |
| type: string | |
| description: The ID of the file to be sent to the model. | |
| image_url: | |
| type: string | |
| description: The URL of the image to be sent to the model. A fully qualified URL or base64 encoded image in a data URL. | |
| type: | |
| enum: | |
| - input_image | |
| type: string | |
| description: The type of the input item. Always `input_image`. | |
| default: input_image | |
| x-stainless-const: true | |
| oneOf: | |
| - required: [file_id] | |
| - required: [image_url] |
🤖 Prompt for AI Agents
In src/libs/tryAGI.OpenAI/openapi.yaml around lines 16718 to 16747, the
InputImageContent-2 schema currently allows both image_url and file_id as
nullable which permits neither or both; replace those nullable properties with a
oneOf construct that enforces exactly one source: define two alternatives, one
requiring image_url and the other requiring file_id, remove nullable on both
properties, and ensure the top-level schema uses oneOf (not both required) so
validation fails if neither or both are provided.
| description: "An array of content items generated by the model.\n\n- The length and order of items in the `output` array is dependent\n on the model's response.\n- Rather than accessing the first item in the `output` array and\n assuming it's an `assistant` message with the content generated by\n the model, you might consider using the `output_text` property where\n supported in SDKs.\n" | ||
| output_text: | ||
| type: string | ||
| description: "SDK-only convenience property that contains the aggregated text output \nfrom all `output_text` items in the `output` array, if any are present. \nSupported in the Python and JavaScript SDKs.\n" | ||
| description: "SDK-only convenience property that contains the aggregated text output\nfrom all `output_text` items in the `output` array, if any are present.\nSupported in the Python and JavaScript SDKs.\n" | ||
| nullable: true | ||
| x-oaiSupportedSDKs: | ||
| - python |
Inconsistent SDK support metadata vs description for output_text.
Description mentions Python and JavaScript, but x-oaiSupportedSDKs lists only python. Add javascript.
x-oaiSupportedSDKs:
- - python
+ - python
+ - javascript

📝 Committable suggestion
| description: "An array of content items generated by the model.\n\n- The length and order of items in the `output` array is dependent\n on the model's response.\n- Rather than accessing the first item in the `output` array and\n assuming it's an `assistant` message with the content generated by\n the model, you might consider using the `output_text` property where\n supported in SDKs.\n" | |
| output_text: | |
| type: string | |
| description: "SDK-only convenience property that contains the aggregated text output \nfrom all `output_text` items in the `output` array, if any are present. \nSupported in the Python and JavaScript SDKs.\n" | |
| description: "SDK-only convenience property that contains the aggregated text output\nfrom all `output_text` items in the `output` array, if any are present.\nSupported in the Python and JavaScript SDKs.\n" | |
| nullable: true | |
| x-oaiSupportedSDKs: | |
| - python | |
| x-oaiSupportedSDKs: | |
| - python | |
| - javascript |
🤖 Prompt for AI Agents
In src/libs/tryAGI.OpenAI/openapi.yaml around lines 22071 to 22077, the property
output_text has a description stating support for Python and JavaScript but the
x-oaiSupportedSDKs metadata lists only "python"; update x-oaiSupportedSDKs to
include "javascript" as well (add the string "javascript" to the array),
ensuring casing matches existing entries and keep YAML formatting/alignment
consistent with surrounding entries.
Summary by CodeRabbit
New Features
Breaking Changes
Documentation